Article: 12840 of comp.protocols.kermit.misc
Path: newsmaster.cc.columbia.edu!watsun.cc.columbia.edu!fdc
From: fdc@watsun.cc.columbia.edu (Frank da Cruz)
Newsgroups: comp.unix.aix,comp.protocols.kermit.misc
Subject: Re: ftp download with help of a file
Date: 5 Oct 2001 14:35:38 GMT
Organization: Columbia University
Lines: 134
Message-ID: <9pkgfq$ipf$1@newsmaster.cc.columbia.edu>
References: <5e23bad3.0110050335.3c8e127@posting.google.com>
NNTP-Posting-Host: watsun.cc.columbia.edu
X-Trace: newsmaster.cc.columbia.edu 1002292538 19247 128.59.39.2 (5 Oct 2001 14:35:38 GMT)
X-Complaints-To: postmaster@columbia.edu
NNTP-Posting-Date: 5 Oct 2001 14:35:38 GMT
Xref: newsmaster.cc.columbia.edu comp.unix.aix:223793 comp.protocols.kermit.misc:12840
In article <5e23bad3.0110050335.3c8e127@posting.google.com>,
antonio <antonio.napoleone@bedag.ch> wrote:
: We download dat files for virus engines via FTP. Usually we do this
: automatically, but for a while we have had problems getting several
: files from the same remote directory with the mget command. In some
: directories, downloading those files works fine, but in others it does
: not. Now I am trying to download file by file, but I am not very
: experienced at this. What I do is get the FILELIST from the remote
: server and extract the filenames with the grep and awk commands. Now my
: list looks like this:
:
: AVH32DLL.DL_
: VIRSIG.DA_
: VIRINFO.DA_
: README.TX_
:
: Is it possible to connect to the FTP host, get each filename from the
: list, download it, and close the connection after I have downloaded
: every file, or do I have to connect, get one file, close, and so on?
:
This would require a bit more flexibility than you'll find in the regular
FTP client. C-Kermit 8.0:
http://www.columbia.edu/kermit/ck80.html
includes a new scriptable FTP client that will let you do this:
http://www.columbia.edu/kermit/ftpclient.html
A scripting tutorial is here:
http://www.columbia.edu/kermit/ckscripts.html
And an FTP-specific scripting tutorial is here:
http://www.columbia.edu/kermit/ftpscripts.html
And complete documentation is here:
http://www.columbia.edu/kermit/ckermit3.html#x3
Here is a script that gets the list of filenames:
cd somelocaldirectory
ftp open foo.bar.com /user:myname /password:secret
if fail exit 1 Can't reach host
if not \v(ftp_loggedin) exit 1 FTP login failed
ftp cd blah/blah/somepath
if fail exit 1 Directory change failed
ftp get /namelist:mylist
if fail exit 1 Can't get list of filenames
ftp bye
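(For readers without Kermit: the same first step can be sketched with
Python's standard ftplib. The host, account, and path names below are
placeholders, not real servers, and NLST is used as a rough stand-in for
Kermit's GET /NAMELIST.)

```python
from ftplib import FTP

def fetch_name_list(host, user, password, path):
    """Log in, change to the remote directory, and return the file
    names there -- roughly what the Kermit script above does."""
    ftp = FTP(host)
    ftp.login(user, password)       # use login() with no args for anonymous
    ftp.cwd(path)
    names = ftp.nlst()              # NLST: bare list of file names
    ftp.quit()
    return names

def save_name_list(names, listfile="mylist"):
    """Write the names one per line, like Kermit's /NAMELIST file."""
    with open(listfile, "w") as f:
        f.write("\n".join(names) + "\n")
```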
Obviously we don't recommend putting passwords in scripts; better methods
are available. The method shown above was chosen for brevity. If you
are using anonymous ftp, the command would be:
ftp open foo.bar.com /anonymous
Now you have the list of filenames in the local file called 'mylist'.
At this point, you should consider what you want to do with it. One
strategy, as you suggest, is to open a new FTP session for each file.
This can be done as follows (still in Kermit):
fopen /read \%c mylist
if fail exit 1 Can't open file list
while not \f_eof(\%c) {
    fread /line \%c filename
    if fail break
    ftp open foo.bar.com /user:myname /password:secret
    if fail exit 1 Can't reach host
    if not \v(ftp_loggedin) exit 1 FTP login failed
    ftp cd blah/blah/somepath
    ftp get \m(filename)
    if fail exit 1 \m(filename): Download failed
    ftp bye
}
fclose \%c
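(The same one-session-per-file loop can be sketched in Python; again the
host, account, and path names are placeholders. Blank lines in the list
file are skipped, as Kermit's FREAD loop effectively does.)

```python
from ftplib import FTP

def read_name_list(listfile):
    """Read one file name per line, skipping blank lines."""
    with open(listfile) as f:
        return [line.strip() for line in f if line.strip()]

def download_each(host, user, password, path, listfile):
    """Open a fresh FTP session for every file, as in the
    brute-force Kermit script above."""
    for name in read_name_list(listfile):
        ftp = FTP(host)
        ftp.login(user, password)
        ftp.cwd(path)
        with open(name, "wb") as out:
            ftp.retrbinary("RETR " + name, out.write)
        ftp.quit()
```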
This is a sort of brute-force attack, and I'm not sure it does what you
want anyway. What happens if a download fails on a particular file?
Here is a more elegant solution:
cd somelocaldirectory
delete *
ftp open foo.bar.com /user:myname /password:secret
if fail exit 1 Can't reach host
if not \v(ftp_loggedin) exit 1 FTP login failed
ftp cd blah/blah/somepath
if fail exit 1 Directory change failed
while true {
    ftp get /update *
    if success break
}
ftp bye
Here we clean out any old copies of the files, make the FTP connection to
the server, cd to the desired server directory, and ask it to send us all
the files in update mode. This means: if I already have a current copy of
a file, don't bother to send it, but if I don't, then please do send it.
If this succeeds, we're done. If it fails, we try again, automatically
skipping the files that were sent previously, and so on until all the files
have been sent.
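(The update-mode decision can be approximated in Python: skip a file only
if we already have a local copy of the same size as the remote one. This
is a rough stand-in -- Kermit's update mode also compares modification
dates -- and the helper takes the remote size as an argument, obtained in
practice from the server's SIZE command, which not all servers support.)

```python
import os

def needs_download(name, remote_size):
    """Crude update mode: fetch if the file is absent locally, the
    remote size is unknown, or the sizes differ."""
    if not os.path.exists(name):
        return True
    if remote_size is None:
        return True                 # can't tell, so fetch it
    return remote_size != os.path.getsize(name)
```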
We can make this script both more robust and more efficient:
cd somelocaldirectory
delete *
while true {
    ftp open foo.bar.com /user:myname /password:secret
    if fail exit 1 Can't reach host
    if not \v(ftp_loggedin) exit 1 FTP login failed
    ftp cd blah/blah/somepath
    if fail exit 1 Directory change failed
    while true {
        ftp get /recover /update *
        if success goto done
        if not \v(ftp_connected) break
    }
    ftp bye
}
done:
This allows for the case when the connection is lost. When this happens,
the script automatically goes back and reestablishes the connection and
restarts the download; if the connection is not lost, however, it does not
needlessly break the connection and reestablish it. In case a long file was
interrupted in the middle, the /RECOVER option makes the download resume
from the point of failure.
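(Resumption works through the FTP REST command: the client reports how
many bytes it already has, and the server restarts the transfer at that
offset. A minimal Python sketch, assuming ftplib and appending to the
partial local file:)

```python
import os
from ftplib import FTP

def resume_offset(name):
    """Byte offset to resume from: the size of the partial local
    file, or 0 if nothing was downloaded yet."""
    return os.path.getsize(name) if os.path.exists(name) else 0

def resume_get(ftp, name):
    """Append to the partial file, asking the server to restart the
    transfer at our offset (the REST mechanism /RECOVER relies on)."""
    with open(name, "ab") as out:
        ftp.retrbinary("RETR " + name, out.write, rest=resume_offset(name))
```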
- Frank